Blog

Author Archive


Spam Generated by Users and Its Effects on SEO


Website optimization has become common practice, with webmasters placing heavy emphasis on the content they create. They plan constantly for good rankings while keeping the content on their sites original. However, user-generated spam remains a serious problem, one that can heavily undermine SEO initiatives.

Renowned Google engineer Matt Cutts has noted that user-generated spam arrives in various forms. Spammy profiles and spammy posts can easily overrun forums, and blog comments are another target: comment threads fill up with links, keywords, and poor-quality text. All of this can have adverse effects on a website and hurt its SEO efforts to a huge extent.

There are, however, ways to protect a website or forum from such attacks through careful planning and research. Spammy posts, comments, and profiles should be deleted from the site. Comment filters on blogs and forums help block spammy terms, and adding a CAPTCHA makes it difficult for bots to post spam automatically, keeping spammy comments from reappearing and helping the site maintain its rank on search pages. Moderators are also widely used these days and are quite effective at tracking posts or comments that get through despite the security measures in place. It may seem an annoying task, but webmasters need to do a lot of legwork to keep their websites up to date and free from spam attacks.
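To make the filtering idea concrete, here is a minimal sketch of a comment filter in Python. The blocklisted phrases and the link threshold are illustrative assumptions, not values prescribed by Google or any particular platform.

```python
import re

# Illustrative blocklist and link cap; tune both for a real site.
BLOCKED_TERMS = {"cheap pills", "casino bonus", "buy followers"}
MAX_LINKS = 2

def looks_like_spam(comment: str) -> bool:
    lowered = comment.lower()
    # Reject comments containing any blocklisted phrase.
    if any(term in lowered for term in BLOCKED_TERMS):
        return True
    # Reject comments stuffed with links.
    return len(re.findall(r"https?://", lowered)) > MAX_LINKS

# Example: three links exceed the cap, so the comment is held back.
print(looks_like_spam("Nice post! http://a.example http://b.example http://c.example"))  # True
```

A filter like this is only a first line of defence; a CAPTCHA and human moderation catch what it misses.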

New SEO-related questions addressed by Matt Cutts


In his latest webmaster video, Matt Cutts discussed questions about penalty link examples, SEO, and Webmaster Tools.

It is important to fully understand a website's link profile. Put simply, webmasters should know exactly which sites link to theirs. PageRank matters a great deal here, and anyone planning for successful SEO should make sure they are associating with quality websites. Webmaster Tools will not comment on incoming links directly, but their quality is reflected in the site's ranking. The tool will, however, display a notice if the webspam team takes action, along with examples of the penalised links. This does not always happen, which can leave site managers in dire straits.

The example links are not always comprehensive, since a given website may carry numerous offending links. The ones listed help webmasters get back on track and recognise the troublesome links. Cutts added that notifying sites of every offending link is next to impossible because the numbers are so large.

Google plans to include more detail in these messages from here on, with two or three example links that can be followed as references.

Implementing SEO is tricky, and dealing with inbound links can make it harder still; a number of sites get penalised for reasons unknown to them. Webmaster Tools should therefore be part of any plan to maintain successful SEO.

Content Freshness and Concealed Content in SEO


Today's discussion concerns website optimization and web-spam, focusing on what SEO expert and Google engineer Matt Cutts has to say on the subject.

The first part of the discussion deals with the safety of concealed content, which a visitor sees only after clicking. Cutts says this kind of content appears all over the web. He cites store sites where, in the course of buying a product, visitors click buttons that reveal specifications in a drop-down; this is done to keep the page tidy. Such elements are harmless as long as they are designed solely for the visitor's benefit, and they are quite safe to use.

However, a tiny button that is barely visible to a visitor yet hides many pages behind it is treated as a violation by Google, and such sites are pushed down in the search rankings.
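As a rough illustration of how a webmaster might audit their own pages for this, the sketch below flags links sitting inside elements hidden with common inline styles. It is a regex-based simplification with invented markup; judging real visibility requires a full DOM and CSS engine.

```python
import re

# Matches an element whose inline style hides it, capturing its inner HTML.
HIDDEN_BLOCK = re.compile(
    r'<[^>]+style="[^"]*(?:display\s*:\s*none|visibility\s*:\s*hidden)[^"]*"[^>]*>(.*?)</',
    re.IGNORECASE | re.DOTALL,
)

def flag_hidden_links(html: str) -> list[str]:
    # Collect href targets found inside hidden elements.
    flagged = []
    for match in HIDDEN_BLOCK.finditer(html):
        flagged.extend(re.findall(r'href="([^"]+)"', match.group(1)))
    return flagged

# Example with invented markup: the hidden link is flagged.
sample = '<div style="display:none"><a href="/doorway-page">secret</a></div>'
print(flag_hidden_links(sample))  # ['/doorway-page']
```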

The next part of the discussion concerns the importance of fresh content for SEO. If, for instance, a person searches for a term such as "earthquake", the results surface mentions of any current occurrence of such an event, so the sites with the freshest information rank higher. For a more generalized, evergreen term, however, freshness is not counted when the search results are assembled.

Finding Bad Links Pointing to the Site


SEO is used to optimize any website being built, and that work can land webmasters in all sorts of trouble, such as handling bad links. If several links from illegitimate sites point at a site, its rank can drop drastically. The aim behind this is to weed out sites using bad backlinking strategies, but, as a matter of fact, legitimate businesses also get punished in the process. Most unfortunately, Google does not provide examples of the "bad" links when it sends notices to sites.

Webmasters run into further SEO trouble after submitting a reconsideration request to Google. They generally ask for clarification about which links caused the actual problem, only to be met with one more denial. This is mainly because the system replies only with "Yes", "No", and "Processed"; it is not capable of supplying the clarification webmasters need.

Matt Cutts, a Google engineer, recommends visiting the Webmaster Tools forum, where experienced webmasters are available to provide helpful feedback and a better understanding of the actual cause of the problem. Webmaster Tools itself also offers examples of "bad" links, which can be quite beneficial for SEO.
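As a triage aid, a webmaster could screen an exported list of backlinks against a homegrown blocklist before filing a reconsideration request. The sketch below assumes a one-column CSV of linking URLs and an invented blocklist; real backlink exports vary in format.

```python
import csv
from urllib.parse import urlparse

# Hypothetical domains the webmaster already distrusts.
SUSPECT_DOMAINS = {"linkfarm.example", "spamdirectory.example"}

def flag_backlinks(csv_path: str) -> list[str]:
    # Return linking URLs whose domain is on the blocklist.
    flagged = []
    with open(csv_path, newline="") as handle:
        for row in csv.reader(handle):
            if not row:
                continue
            if urlparse(row[0]).netloc.lower() in SUSPECT_DOMAINS:
                flagged.append(row[0])
    return flagged

# Usage: flag_backlinks("backlinks.csv") -> list of suspect linking URLs.
```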

Cutts has already acknowledged that transparency is lacking at present, which can affect legitimate businesses adversely. The main reason the system works this way is the growing number of spammers; it is simply not possible to respond to every individual trying to fool the system with illegitimate tactics.

Matt Cutts answers SEO-related questions


Google's Matt Cutts faced some questions on website downtime and voice search in this week's one-to-one.

Someone asked whether a website will suffer, particularly in its ranking and SEO, if it stays down for a period of 24 hours. The man from Google answered that a few hours of downtime is no big concern, but if the hours start turning into weeks, the site is asking for immediate attention. Because Google always tries to maintain the integrity of its results, it will not lead visitors to a malfunctioning website. The best way to preserve a site's ranking, then, is to resolve such issues as early as possible.

Voice search was next in line, as one questioner was interested to know whether the popularity of voice search influences search syntax in some way. Cutts said that spoken queries are comparatively more natural, and added that the company is trying to make voice search better with every passing day. The search algorithms boil queries down to get hold of the gist of what the speaker is asking. Query syntax has gone through a lot of changes over the last few years and will continue to do so; Cutts hinted this process will run for an indefinite period, so SEO professionals would be wise to keep performing consistently if they want to survive in this game.

Source: http://www.localseotampa.com/voice-search-search-syntax-related-questions-answered-by-matt-cutts-201307/

Attributing Source Content with SEO


When writing original content for a blog, people often wonder how the sources they used should be attributed; in other words, they want to learn where in an article to place the links to the resources that were used in writing it. Recently, a webmaster contacted Matt Cutts, a Google engineer, to learn more about this issue and its importance in terms of SEO, and some of his answers were genuinely surprising.

The thing is, where the source attributions are placed does not actually matter for rankings. While building up a site's SEO within the Google Webmaster guidelines, people often assume that where the links sit within an article decides the PageRank flow and helps the site achieve better ranks, but that is not actually true. Instead, Matt offered the webmaster tips that take the reader's point of view.

With that in mind, Matt feels that the link to an article's original source (the article from which the writer got the information) should be placed high in the post, ideally in the first paragraph. He also stated that one cannot just mention "According to a popular auto blog…" or "According to (name of the website)…"; the actual link should be placed within the article so that readers are directed to the source material. Otherwise the source is technically attributed, but the writer is paying no attention to PageRank flow and is being inconsiderate to readers in search of additional information, which is no doubt an essential part of SEO.
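As a rough self-check of this advice, a blogger could verify that a post's source link actually lands in the opening paragraph. The function below is a minimal sketch that assumes simple <p>-delimited HTML with invented markup and URL; a production check would use a proper HTML parser.

```python
import re

def source_link_in_first_paragraph(post_html: str, source_url: str) -> bool:
    # Pull out the first <p>...</p> block and look for the source link inside it.
    first = re.search(r"<p\b[^>]*>(.*?)</p>", post_html, re.IGNORECASE | re.DOTALL)
    return bool(first) and source_url in first.group(1)

# Example with invented markup and URL.
post = '<p>According to <a href="https://autoblog.example/review">Auto Blog</a>, ...</p><p>More text.</p>'
print(source_link_in_first_paragraph(post, "https://autoblog.example/review"))  # True
```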

There is obviously no harm in placing the source link at the bottom of the article, as long as it is set off clearly, for example as "Source: Website", where the website name is the link. When the article is quite long, however, there is only a slim chance that readers will scroll to the bottom, which means the source is effectively not being acknowledged at all.

Matt Cutts advises bloggers and webmasters to acknowledge the source from which they obtained their information. That means not only citing the sources but being considerate about where the attribution links are placed within the article. Readers should find it easy to reach additional information on the article's topic; responsibility on the web is about much more than SEO.

 

Considering SEO and web-spam


Google's Webmaster Guidelines are an important resource for developers and administrators of websites throughout the world. They provide insight into illicit practices and beneficial techniques in SEO. Several techniques benefit websites, and many adversely affect them; among the latter are content violations, which reduce the visibility of websites and lead to a decline in their respective businesses.

Syndicated content comes first in the discussion: material copied from one or more sources and pasted elsewhere. Websites that provide genuine, relevant material are fine, but those that aggregate information of almost every category and link to sites of any niche are very likely to be punished by Google, the largest search engine on the web.

Affiliate programmes are another SEO concern. To rank better with content from affiliates, webmasters must not simply republish the supplied material unedited; it is original content that helps a website rank better. Genuine reviews or descriptions of services and products are valuable to visitors and therefore increase the importance of the site on which they are posted.

Doorway sites are another hazard, and every webmaster should strive to stay away from them. Their only function is to place many links in the search results, all of which direct to a single site. The results for specific search terms thus get congested, and the search experience suffers.

Google considers these SEO techniques violations and penalises the websites involved. The best practice is to adhere to the guidelines and offer good-quality content to visitors.

Are organic rankings for political searches influenced by Google?


Recent statements by some commentators suggest that Google may be in a position to influence how a person feels about political topics every time they search for something related to politics.

A study was conducted recently to evaluate SEO's role in influencing political decisions. When a person searches online for a candidate or a political topic, the result ranking highest is the one most likely to influence that user, whether in voting pattern or outlook. The study also revealed that rankings and search results shifted voting patterns by almost 15%, a percentage large enough to affect election results.

This shows that SEO can be of great importance, and it is high time it were paid its due respect. Political candidates and organisations invest in SEO just like ordinary companies, in an effort to rank their sites highly in search engines.

It is entirely possible to purchase advertisements that appear to the right of the results or at the top of the page, and these will no doubt draw plenty of clicks. But organic ranking is the topmost treasure for anyone planning to influence politics in any meaningful way.

Some of the most common mistakes webmasters make


Over the years, one of the most debated topics has been the relevance of content freshness. While a blog can be updated on a regular basis, it is difficult to update a company website as frequently. Google's recent algorithm changes and updates indicate that websites updated regularly get a boost in their search engine rankings and rank much better. Another thing to avoid is publishing a backlog of content in one go; it can harm a website, and scheduling posts is recommended instead of publishing them all at once. This way webmasters ensure that their websites are continually crawled by the search engines, which helps the articles rank much better.

There are a few things webmasters commonly get wrong, and most do so unknowingly. Some of the areas where they go astray are discussed below:

  • Not making the site crawlable: This is one of the most common mistakes among webmasters. Pages the webmaster wishes to have indexed by Google need to be easily accessible to the search engines, and this should be addressed when designing the website's architecture and navigation. On big websites, navigation tends to get difficult, but it remains the webmaster's responsibility to organize the content so that a first-time visitor can easily find what they are searching for (a crawlability self-check is sketched in the code after this list).
  • Not including the right words on the page: Matt Cutts mentions that some webmasters do not use the right words on their pages. He uses the comparative example of "Mount Everest elevation" versus "how high is Mount Everest". Many would wonder what difference the choice of words could make, and this is where the Google Keyword Tool becomes relevant: it lists around 6,600 global searches for "Mount Everest elevation" but about 8,100 for "how high is Mount Everest". The difference shows that using the right keywords can help content rank better.
  • Not producing compelling content and marketing: This is another mistake many webmasters make without intending to. In the past, sites with lots of links were seen to get a boost in the SERPs, which drove most webmasters to build more and more links. That actually degraded search quality, and with the recent Panda update, sites with poor links got penalized. It is now all the more important for webmasters to publish rich, quality content that attracts external links on its own.
  • Not thinking about titles and descriptions on the really important pages: Termed on-site SEO, having titles and descriptions for a website and its pages is very important. A lot of money gets spent on a site's design and link building, yet many webmasters neglect or avoid writing descriptions and tags for their pages. On-site SEO can be boosted with the right description and title on every page.
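A minimal self-audit sketch for the first and last points above, using only the Python standard library. The site and page URLs are placeholders, and the checks are simplifications; robots.txt rules and on-page metadata vary from site to site.

```python
import urllib.robotparser
from html.parser import HTMLParser
from urllib.request import urlopen

SITE = "https://example.com"   # placeholder site
PAGE = SITE + "/services"      # placeholder page to audit

# 1. Crawlability: does robots.txt allow Googlebot to fetch the page?
robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()
print("Crawlable by Googlebot:", robots.can_fetch("Googlebot", PAGE))

# 2. On-site metadata: does the page carry a <title> and a meta description?
class MetaAudit(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

audit = MetaAudit()
audit.feed(urlopen(PAGE).read().decode("utf-8", errors="replace"))
print("Title:", audit.title.strip() or "MISSING")
print("Meta description:", audit.description or "MISSING")
```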

 

SERP subjectivity and advertorial questions answered by Matt Cutts


Matt Cutts, head of Google's webspam team, recently shared his views on advertorials on his YouTube channel.

Advertorials may look like age-old advertising, but they read more like editorials or natural content. The problem is that when people go through an advertorial piece, they draw the quick conclusion that the content is editorial and was written independently. Many a time, advertorials also pass PageRank, which creates quite a problem because the arrangement is rarely disclosed. Cutts stated that anyone found doing this could see a number of SEO penalties heading their way.

To avoid those penalties, one needs to reveal the advertorial nature of the content to Google. This can be done with the rel="nofollow" attribute on the paid links, which tells Google not to pass PageRank through them.
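To illustrate, here is a minimal Python sketch that adds rel="nofollow" to anchor tags in a snippet of HTML. It is a regex-based simplification over invented markup; production code should use a real HTML parser and handle existing rel values properly.

```python
import re

def add_nofollow(html: str) -> str:
    # Append rel="nofollow" to anchor tags that carry no rel attribute yet.
    def rewrite(match: re.Match) -> str:
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave anchors with an existing rel attribute untouched
        return tag[:-1].rstrip() + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", rewrite, html)

# Example with invented markup: the paid link gets rel="nofollow".
print(add_nofollow('Sponsored: <a href="https://advertiser.example">widgets</a>'))
# Sponsored: <a href="https://advertiser.example" rel="nofollow">widgets</a>
```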

The next question Cutts dealt with concerned Google's view of search quality when it relies on subjective signals. The underlying assumption is quite faulty, he explained: many think that, as far as SERPs are concerned, Google cannot determine the context of a search query, but in fact it can do so quite often. Human testers can deduce the context simply by looking at the query and the domain; it is quite easy for them to see how the two are related, even when they are unfamiliar with the subject matter.