When Googlebot crawls a page, it should see the page the same way an average user does. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
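To illustrate, here is a minimal robots.txt sketch (the /css/, /js/ and /images/ paths are illustrative, not from any particular site) showing page resources kept crawlable rather than blocked:

```
# robots.txt – keep rendering resources crawlable
User-agent: Googlebot
# Rules like "Disallow: /css/" or "Disallow: /js/" would stop
# Googlebot from rendering the page the way users see it.
Allow: /css/
Allow: /js/
Allow: /images/
```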
Try to get links within page text pointing to your site with relevant, or at least natural-looking, keywords in the text link – not, for instance, in blogrolls or site-wide links. Try to ensure the links are not obviously “machine generated”, e.g. site-wide links on forums or directories. Get links from pages that, in turn, have a lot of links pointing to them, and you will soon see benefits.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
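In practice, nofollowing a comment link is a one-attribute change. A minimal sketch (the URL and anchor text are made up for illustration):

```html
<!-- A user-submitted comment link, marked so it passes no reputation -->
<a href="https://example.com/unvetted-site" rel="nofollow">commenter's link</a>
```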
QUOTE: “alt attribute should be used to describe the image. So if you have an image of a big blue pineapple chair you should use the alt tag that best describes it, which is alt=”big blue pineapple chair.” title attribute should be used when the image is a hyperlink to a specific page. The title attribute should contain information about what will happen when you click on the image. For example, if the image will get larger, it should read something like, title=”View a larger version of the big blue pineapple chair image.” John Mueller, Google
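Putting John Mueller's example into markup, a sketch might look like this (the file paths are illustrative):

```html
<a href="/pineapple-chair-large.html"
   title="View a larger version of the big blue pineapple chair image">
  <img src="/img/pineapple-chair.jpg" alt="big blue pineapple chair">
</a>
```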
Google and Bing use crawlers (Googlebot and Bingbot) that spider the web looking for new links to follow. These bots might find a link to your homepage somewhere on the web and then crawl and index the pages of your site, provided all your pages are linked together. If your website has an XML sitemap, Google will use that to include that content in its index. An XML sitemap is INCLUSIVE, not EXCLUSIVE. Google will crawl and index every single page on your site – even pages not listed in the XML sitemap.
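For reference, a minimal XML sitemap follows the sitemaps.org protocol; the URL and date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>
```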
Both require knowing your audience. To succeed at both strategies, you must have a good understanding of your audience and how they act. By using buyer personas and psychographic segmentation, you can get to know your audience, discover what their needs are, and what they are searching for. Then you can create valuable content that shows up when they go looking for solutions related to your brand.
Comparing your Google Analytics data side by side with the dates of official algorithm updates is useful in diagnosing a site health issue or traffic drop. In one example, a new client thought it was a switch to HTTPS and server downtime that caused the drop, when it was actually the May 6, 2015, Google Quality Algorithm (originally called Phantom 2 in some circles) that caused the sudden drop in organic traffic – and the problem was probably compounded by unnatural linking practices. (This client did eventually receive a penalty for unnatural links when they ignored our advice to clean up.)
While you can often start with a keyword and create a piece of content around that term, sometimes your content already exists, and you need to figure out how to match it to keywords. To do this, create what's known as a "content to keyword map." Creating this map can help you understand the impact of your existing content and identify weak links or gaps that need filling.
QUOTE:  “Tell visitors clearly that the page they’re looking for can’t be found. Use language that is friendly and inviting. Make sure your 404 page uses the same look and feel (including navigation) as the rest of your site. Consider adding links to your most popular articles or posts, as well as a link to your site’s home page. Think about providing a way for users to report a broken link. No matter how beautiful and useful your custom 404 page, you probably don’t want it to appear in Google search results. In order to prevent 404 pages from being indexed by Google and other search engines, make sure that your webserver returns an actual 404 HTTP status code when a missing page is requested.” Google, 2018
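One way to serve a custom 404 page while still returning the real 404 status code is a server-level error document. A sketch for Apache, assuming the custom page lives at /404.html:

```
# Apache: serve a friendly page but keep the 404 HTTP status.
# Use a local path – a full URL here would trigger a redirect
# and the 404 status would be lost.
ErrorDocument 404 /404.html
```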

Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
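If you want one canonical version to win out, you can 301-redirect the other three variants to it. A sketch for Apache mod_rewrite, assuming https://www.example.com is the preferred host:

```
RewriteEngine On
# Redirect plain-HTTP and non-www requests to the canonical host
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.example.com%{REQUEST_URI} [L,R=301]
```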
Off-page SEO builds a website’s reputation and authority by connecting it to other high-quality websites. Off-page SEO techniques include link building (acquiring high-quality backlinks from other websites) and managing local listings and directory profiles. When many websites link to a brand’s website, it shows search engines that the brand’s website is trustworthy, reliable, and reputable, which increases its search rankings.
Website owners recognized the value of a high ranking and visibility in search engine results,[6] creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as one of the first people to popularize the term.[7] On May 2, 2007,[8] Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona[9] that SEO is a "process" involving manipulation of keywords and not a "marketing service."
Google ranks websites (relevancy aside for a moment) by the number and quality of incoming links to a site from other websites (amongst hundreds of other metrics). Generally speaking, a link from one page to another is viewed in Google’s “eyes” as a vote for the page the link points to. The more votes a page gets, the more trusted a page can become, and the higher Google will rank it – in theory. Rankings are HUGELY affected by how much Google ultimately trusts the DOMAIN the page is on. BACKLINKS (links from other websites) trump every other signal.
As an Internet marketing strategy, SEO considers how search engines work, the computer-programmed algorithms that dictate search engine behavior, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred by their targeted audience. SEO is performed because a website will receive more visitors from a search engine the higher the website ranks in the search engine results page (SERP). These visitors can then be converted into customers.[4]

QUOTE: “Another problem we were having was an issue with quality, and this was particularly bad (we think of it as around 2008/2009 to 2011). We were getting lots of complaints about low-quality content, and they were right. We were seeing the same low-quality thing, but our relevance metrics kept going up, and that’s because the low-quality pages can be very relevant. This is basically the definition of a content farm in our vision of the world. So we thought we were doing great, our numbers were saying we were doing great, and we were delivering a terrible user experience. It turned out we weren’t measuring what we needed to, so what we ended up doing was defining an explicit quality metric which got directly at the issue of quality – it’s not the same as relevance…. and it enabled us to develop quality-related signals separate from relevance signals and really improve them independently. So when the metrics miss something, what ranking engineers need to do is fix the rating guidelines… or develop new metrics.” SMX West 2016 – How Google Works: A Google Ranking Engineer’s Story (VIDEO)
So what are the rules of thumb here? First and foremost, it’s very important that your content is easy to read. Of course, you should use your keywords in your text, but don’t stuff your keywords into almost every sentence. In general, if 1 or 2% of all the words in your copy are your keyword, you’re not overdoing it – in a 500-word post, that’s roughly 5 to 10 uses. Make sure your keywords are well-distributed throughout your text. Don’t put all your keywords in the first paragraph thinking you’re done with that part of the optimization. Naturally spread the keywords throughout your page or post. Use your keywords in a subheading or a couple of subheadings, depending on the length of your page or post. And use the keyword in your page title, first paragraph and your meta description. You can find all of these recommendations in the SEO analysis of Yoast SEO.
QUOTE: “Content which is copied, but changed slightly from the original. This type of copying makes it difficult to find the exact matching original source. Sometimes just a few words are changed, or whole sentences are changed, or a “find and replace” modification is made, where one word is replaced with another throughout the text. These types of changes are deliberately done to make it difficult to find the original source of the content. We call this kind of content “copied with minimal alteration.” Google Search Quality Evaluator Guidelines March 2017
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollow links in the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the use of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the use of iframes, Flash and JavaScript.[32]
And you’ve got to diversify. Here’s a tongue-twister that’s absolutely true: diversity is a key word in the keyword world. You’re not going to stand out if you find yourself using all of the same keywords as your competitors. Not only should you try new keyword search tools and keep track of the results, but you should feel free to experiment based on your own research – who else uses your keywords? And how do you make yourself stand out? By providing great content that truly answers the questions your prospective customers are asking with their keyword searches.

The last time I looked, Google displayed as many characters as it can fit into a block element that’s about 600px wide and doesn’t exceed one line of text (on desktop). So THERE IS NO BEST-PRACTICE NUMBER OF CHARACTERS any SEO could lay down as an exact rule to GUARANTEE a title will display in full in Google, at least as the search snippet title, on every device. Ultimately, only the characters and words you use will determine if your entire page title will be seen in a Google search snippet.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
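For reference, one common textbook formulation of this calculation (this particular form is a standard statement, not taken from the passage above) is:

PR(A) = (1 − d)/N + d · Σᵢ PR(Tᵢ)/C(Tᵢ)

where d is the damping factor (typically 0.85), N is the total number of pages, the Tᵢ are the pages linking to A, and C(Tᵢ) is the number of outbound links on Tᵢ. The damping factor models the random surfer occasionally abandoning the link chain and jumping to a random page.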
Google is a link-based search engine. Google doesn’t need content to rank pages, but it needs content to give to users. Google needs to find content, and it finds content by following links, just like you do when clicking on a link. So you first need to make sure you tell the world about your site so other sites link to yours. Don’t worry about reciprocating links to more powerful sites, or even to real sites – I think this adds to your domain authority, which is better to have than ranking for just a few narrow key terms.

Think about how Google can algorithmically and manually determine the commercial intent of your website. Think about the signals that differentiate a real small business website from a website created JUST to send visitors to another website, with affiliate links on every page, for instance. Adverts on your site above the fold, for example, can be a clear indicator of a webmaster’s particular commercial intent – hence why Google has a Top Heavy Algorithm.


In addition to processing the text content on your web pages, Google will also try to figure out what your images are about as well. Alt Text is a short description that you can customize for each image to let Google know what the image is about. Setting short, descriptive Alt Texts through our site builder will help Google better associate your web pages with the search terms you're trying to target.
Since its April 2015 update, Google has been taking the user's mobile experience into consideration in its search results. This means that when a user does a search on a mobile device, Google's search results will favor websites that are mobile-friendly over the ones that aren't. If you want to capture that mobile search audience, you will need to have a mobile version of your website in addition to your desktop version.
Use common sense – Google is a search engine – it is looking for pages to give searchers results, and 90% of its users are looking for information. Google itself WANTS the organic results full of information. Almost all websites will link to relevant information content, so content-rich websites get a lot of links – especially quality links. Google ranks websites with a lot of links (especially quality links) at the top of its search results, so the obvious thing you need to do is ADD A LOT of INFORMATIVE CONTENT TO YOUR WEBSITE.
But Google isn’t the only reason why keywords are important. Actually, Google is the less important reason, because you should always focus on the user: on your visitors and potential clients. With SEO you want people to land on your website when using a certain search term or keyword. You need to get into the heads of your audience and use the words they use when they are searching.
I think anchor text in internal navigation links is still valuable – but keep it natural. Google needs links to find and help categorise your pages. Don’t underestimate the value of a clever, keyword-rich internal link architecture, and be sure to understand, for instance, how many words Google counts in a link – but don’t overdo it. Too many links on a page could be seen as a poor user experience. Avoid lots of hidden links in your template navigation.
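As a sketch (the URL and anchor wording are made up for illustration), descriptive internal anchor text tells Google what the target page is about, where a vague anchor does not:

```html
<!-- Descriptive, keyword-rich anchor text -->
<a href="/guides/keyword-research/">beginner's guide to keyword research</a>

<!-- Vague anchor text that tells Google nothing about the target page -->
<a href="/guides/keyword-research/">click here</a>
```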
Basically, SEO keyword research should be an ongoing and ever-evolving part of your job as a marketer. Old keywords need to be reevaluated periodically, and high-volume, competitive keywords (or “head” keywords, as opposed to long-tailed keywords) can often be usefully replaced or augmented with longer, more specific phrases designed not to bring in just any visitor but exactly the right visitors. (Who visits your site – particularly if they’re people who are actively looking for your services – is at least as important as how many people visit.)
When I think ‘Google-friendly’ these days – I think a website Google will rank top, if popular and accessible enough, and won’t drop like a f*&^ing stone for no apparent reason one day, even though I followed the Google SEO starter guide to the letter….. just because Google has found something it doesn’t like – or has classified my site as undesirable one day.
It's wonderful to deal with keywords that have 50,000 searches a month, or even 5,000 searches a month, but in reality, these popular search terms only make up a fraction of all searches performed on the web. In fact, keywords with very high search volumes may even indicate ambiguous intent, and if you target these terms, you could be at risk of drawing visitors to your site whose goals don't match the content your page provides.
If you have original, quality content on a site, you also have a chance of generating inbound quality links (IBL). If your content is found on other websites, you will find it hard to get links, and it probably will not rank very well as Google favours diversity in its results. If you have original content of sufficient quality on your site, you can then let authority websites – those with online business authority – know about it, and they might link to you – this is called a quality backlink.

Google will INDEX perhaps thousands of characters in a title… but I don’t think anyone knows exactly how many characters or words Google will count AS a TITLE TAG when determining the RELEVANCE OF A DOCUMENT for ranking purposes. It is a very hard thing to try to isolate accurately, with all the testing and obfuscation Google uses to hide its ‘secret sauce’. I have had ranking success with longer titles – much longer titles. Google certainly reads ALL the words in your page title (unless you are spamming it silly, of course).
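For reference, the title lives in the page head, and given the roughly 600px display limit noted above, it makes sense to front-load the words you most want seen (the wording below is illustrative):

```html
<head>
  <title>Big Blue Pineapple Chair – Handmade Seating | Example Store</title>
</head>
```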
SEM search results have ad extensions. SEO search results have featured snippets. When comparing SEM vs. SEO, you’ll also find differences in the appearance of the search results. SEM search results may include ad extensions, which can add on additional links, phone numbers, and callouts. On the other hand, SEO results may appear with featured snippets in search.
KISS does not mean boring web pages. You can create stunning sites with smashing graphics – but you should build these sites using simple techniques – HTML & CSS, for instance. If you are new to web design, avoid things like Flash and JavaScript, especially for elements like scrolling news tickers, etc. These elements work fine for TV – but only cause problems for website visitors.


I’ve always thought that if you are serious about ranking – do so with ORIGINAL COPY. It’s clear – search engines reward good content they haven’t found before. They index it blisteringly fast, for a start (within a second, if your website isn’t penalised!). So – make sure each of your pages has enough text content you have written specifically for that page – and you won’t need to jump through hoops to get it ranking.

While the title tag is effectively your search listing’s headline, the meta description (another meta HTML element that can be updated in your site’s code, but isn’t seen on your actual page) is effectively your site’s additional ad copy. Google takes some liberties with what they display in search results, so your meta description may not always show, but if you have a compelling description of your page that would make folks searching likely to click, you can greatly increase traffic. (Remember: showing up in search results is just the first step! You still need to get searchers to come to your site, and then actually take the action you want.)
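The meta description is a single element in the page head. A minimal sketch (the content text is made up for illustration):

```html
<meta name="description"
      content="Learn how to research keywords, write page titles and earn quality links – a plain-English SEO guide for beginners.">
```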


Ever wonder how major search engines such as Google, Bing and Yahoo rank your website within their searches? Or how content such as videos or local listings is shown and ranked based on what the search engine considers most relevant to users? Welcome to the world of Search Engine Optimization (SEO). This course is the first within the SEO Specialization and it is intended to give you a taste of SEO. You will be introduced to the foundational elements of how search engines work, how the SEO landscape has changed and what you can expect in the future. You will discuss core SEO strategies and tactics used to drive more organic search results to a specific website or set of websites, as well as tactics to avoid to prevent penalization from search engines. You will also discover how to position yourself for a successful career in SEO, should this subject prove interesting to you. We hope this taste of SEO will entice you to continue through the Specialization.
Thick & Unique Content – There is no magic number in terms of word count, and if you have a few pages of content on your site with a handful to a couple hundred words each, you won’t be falling out of Google’s good graces. But in general, recent Panda updates in particular favor longer, unique content. If you have a large number (think thousands) of extremely short pages (50-200 words of content), or lots of duplicated content where nothing changes but the page’s title tag and, say, a line of text, that could get you in trouble. Look at the entirety of your site: are a large percentage of your pages thin, duplicated and low value? If so, try to identify a way to “thicken” those pages, or check your analytics to see how much traffic they’re getting and simply exclude them from search results (using a noindex meta tag) to keep it from appearing to Google that you’re trying to flood their index with lots of low-value pages in an attempt to have them rank.
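The noindex exclusion mentioned above is one line in the head of each page you want kept out of search results:

```html
<meta name="robots" content="noindex">
```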
QUOTE: “The average duration metric for the particular group of resources can be a statistical measure computed from a data set of measurements of a length of time that elapses between a time that a given user clicks on a search result included in a search results web page that identifies a resource in the particular group of resources and a time that the given user navigates back to the search results web page. …Thus, the user experience can be improved because search results higher in the presentation order will better match the user’s informational needs.” High Quality Search Results based on Repeat Clicks and Visit Duration
Keyword analysis. From nomination, further identify a targeted list of keywords and phrases. Review competitive lists and other pertinent industry sources. Use your preliminary list to determine an indicative number of recent search engine queries and how many websites are competing for each keyword. Prioritize keywords and phrases, plurals, singulars and misspellings. (If search users commonly misspell a keyword, you should identify and use it.) Please note that Google will try to correct the term when searching, so use this with care.