Search engine optimization (SEO) is the attempt to get search engines such as Google, Bing (Microsoft) and Yahoo to notice your website and give it a good position (rather than showing up on the 20th page of results).
If you are making "yellow plastic sharks", then you want anyone who enters "yellow plastic sharks" into a web search engine to find you, preferably on the first few pages of results. That is what search engine optimization does.
The various search engines use programs called "spiders" or "crawlers" that read and catalog everything they find on the web. Getting "found" by the search engines isn't a problem; the spiders will find any new website in a matter of weeks. The problem is getting near the top of the search results. Getting a good ranking is what search engine optimization is all about.
These "spider" programs use secret algorithms that decide how valuable your website is and how much information about "yellow plastic sharks" your website has. If the "spider" decides that your website is important and has good "yellow plastic shark" info, then you will get a good search position and your potential clients will have an easy time finding you.
Search engine optimization is a large business. There are lots of optimization companies that will take your money and do nothing but suggest modifications to your existing website in order to optimize search rankings. Much of it is snake oil silliness. The search engines will not publish how the spiders decide to rank a website. If they did, everyone would "game the system". It is as secret as the recipe for Coca-Cola or Colonel Sanders' formula of 11 herbs and spices. But, most agree that modern spiders are looking for (in approximate order of importance: #1 most important, #14 almost worthless):
1) Text content, both keywords and quantity
2) Incoming links from real (not link farm) websites
3) Traffic, visitors to your web pages
4) Responsive coding for mobiles
5) Good descriptive page titles and file names
6) Age of website
7) Secure (SSL) hosting
8) Date page was last updated
9) Robot text file and site map
10) Alt tag descriptions on photographs
11) Handicap compliant code (WCAG 2 and ARIA)
12) Meta tags
13) Lack of broken links
14) W3C compliant code
Lots of folks would argue about the exact order we've used in the above optimization list. Remember that we are all guessing at a secret. However, by optimizing the above features on your website, you will get far better search engine placement. Let's look at each optimization item in more depth.
1) Text content, both keywords and quantity: The type you are currently reading (and almost all of the words on our website) is "non-image text", also called "HTML text". Search engine spiders can read and rate these words. HTML text is the most valuable factor in how the spiders rank your website! Using our "yellow plastic shark" example, the spiders will read a website and consider "keywords" in your text. Keywords are anything your customers might type into a search engine while looking for your company. You want your HTML text to contain these keywords a number of times. So, to optimize your keywords you would want to make sure that "yellow plastic shark" was repeatedly used in your text. You might also want to make sure that words such as "toy" and "fish" were repeatedly used. Optimization of keywords should not be taken to extremes. If a keyword rises above roughly 6% of the total words, the text becomes hard for customers to read and starts to seem grammatically forced.
Where you use keywords is also ranked by the spiders. Keywords on your homepage tend to be more valuable than keywords on interior pages. Keywords using "header tags" (larger bold words, see the top few lines of this page) are considered more valuable than the smaller "paragraph text" you are currently reading. Keywords at the top of a page have slightly more weight than words at the bottom of a page.
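As a rough illustration (the product wording is made up for this sketch), keyword placement in a header tag versus paragraph text looks like this in a page's HTML code:

```html
<!-- Keywords inside a header tag carry more weight with the spiders -->
<h1>Yellow Plastic Sharks - Toy Fish for Pools and Bathtubs</h1>

<!-- The same keywords in ordinary paragraph text still count, but for less -->
<p>Our yellow plastic sharks are durable toy fish that are safe
for pools, bathtubs and sandboxes.</p>
```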
The quantity of words and pages is also important to the search engine spiders. If you have 2 sentences on your "yellow plastic shark" homepage and your competitor has 4 paragraphs, then in the quite competitive "yellow plastic shark" market your competitor will probably be higher in the search engine rankings than you. Likewise, if you have 4 pages of information and your competitor has 20 pages, you will get a lower ranking.
The way most HTML text works is that the programming on your site sends out the coding for the letters and "requests" a font from the viewer's computer. If the website is well programmed, it will actually request a family of fonts, just in case the viewer's computer doesn't have the exact font requested. A typical example would be asking the viewer's computer to use its copy of Arial. If that isn't available, it asks for Helvetica. If that isn't available either, it asks for the computer's default sans-serif font.
What that means to the client and the web designer is that we have to give up some control. Because we are using the viewer's fonts, we cannot control the layout of the text in the exacting way expected in printed media.
When you need exact control of font type and placement, you may want to use image text. Unfortunately, image text is unreadable by search engine spiders.
2) Incoming links from real (not link farm) sites: Along with the search engine spider reading what it can on your site, it considers how many other sites link back to your site. If you can get other sites (or blogs and posts) to link back to your site, it will help significantly with your search engine optimization and ranking. If the sites linking back to your site are in related industries and/or have high rankings themselves, it helps even more. If our "yellow plastic shark" site could get toy stores or toy distributors to include a link to our site, it would be a big optimization help.
You should beware of using "link farms". These are sites that (usually) charge you a small fee to put your link on their service. These websites are nothing but a long list of links. The search engine spiders know what these sites are. At best, you've wasted your money on the link farm. At worst, many believe that the search engines rank you lower for using these sites.
3) Traffic: Traffic is simply the number of people viewing your site. Yes, this is a frustrating one! It is hard to get traffic without a good search engine ranking, and they won't give you that ranking until you have traffic. This is one of the primary reasons that you need a marketing plan beyond the web. Particularly with "start up" companies, you need to "drive" people to your website. This can be done with print advertising, newspaper ads, word of mouth and many other forms of non web advertising.
4) Responsive coding for mobiles: Sites can be "static", meaning the same code is used no matter what device you view the pages on. This type of coding now gets you a lower search engine ranking. The search engines want all sites to either have a completely separate mobile site (wonderfully versatile, but also very expensive) or have "responsive" coding. "Responsive" means that images change size depending on the screen size used. It also means that columns that display horizontally on a large computer screen are re-arranged vertically on smaller (mobile) screens. Lastly, primary navigation typically changes from a "horizontal buttons" system on large computer screens to a "hamburger" menu that opens a full-width list. This allows each column and nav button seen on a mobile screen to be 100% of the screen width, making the contents much easier to read on a small (mobile) screen.
5) Good descriptive page titles and file names: Page titles appear at the very top of many web browser windows. For this page you may see "Search Engine Optimization - SEO - Ferguson Photography and Design Simi Valley, CA" which describes what our page is about, who we are and where we are. Because search engine spiders can read this, it does help in search engine rankings. File names are less obvious to the viewer, but are read by the spiders. Everything from the page's technical file name (seo.php for this page) to the technical file name of the compass image on this page (search.jpg) can be read by the spiders and helps optimization.
6) Age of site: There is frustratingly little a business owner or designer can do to optimize this factor. The search engine spiders know when they first found a site and they consider an older site more valuable than a newer site. Luckily it isn't the most valuable criteria, but it is part of the equation.
7) Secure hosting: Web hosts are available in "http" or "https" varieties. "Https" is a far more secure hosting service and is required if you are taking credit card or personal info on your website. Not surprisingly, it is also more expensive. Many browsers now display a warning when visiting a non-https site, making the viewer feel uncomfortable even if you are not taking credit card or personal info. Search engines now give you a slightly better ranking if you have purchased https hosting.
Note: The above items #1 - #7 probably make up 90% of your optimization. While the below items #8 - #14 are still important, the search engine spiders give them far less weight.
8) Date page was last updated: Just as search engines know the age of your site (see #6 above), they know the last time each of your pages has been significantly changed. The difference between this and #6 is that the spiders give you a better ranking if you significantly update your pages on a regular basis. A page that gets new material every few months is considered more important than a page that gets new material once a year, and both of those are considered more important than a page that has been left unchanged for many years. Because the spiders typically consider your homepage more important than interior pages, regularly changing the info (HTML text) on your homepage helps with your optimization.
9) Robot text file and site map: These are invisible files that tell the search engine spiders the "architecture" of your site and what parts of the site you do (or do not) want the spiders to look at and rank. Because they make the spiders' job "a little bit" easier, they also optimize your site and get you "a little bit" better rating. Perhaps more importantly, they allow you to put parts of your website off limits to the spiders. If you have pages or folders that are only intended for your existing customers, or are in any way private, these can be excluded from the search engines.
10) Alt tag descriptions on photographs: These are special tags (coding) intended to help the vision impaired to use the web. Along with being important to the vision impaired, they are required for a page to be handicap compliant (see #11 below) and W3C compliant (see #14 below) and they are read by the search engine spiders. That gives us one more place to put keywords and help our search engine optimization. Alt tags are far less important than regular HTML text (see #1 above).
11) Handicapped helpful code: There are almost no official requirements for handicapped accessibility on most websites from the American government. If you do business with the government, or do significant business in Europe, then regulations may start to be an issue.
Unfortunately, the lack of American regulations on handicapped accessibility for most websites hasn't stopped lawyers from claiming regulations about "public accommodations" (hotels and restaurants for example) should also apply to websites. These are often "nuisance suits" demanding an amount of cash for the lawyer to just "go away".
Websites designed "close to" compliance with the W3C's "WCAG 2 Level A" and "ARIA" standards (the guidelines referenced by European Union regulations) get you both a slightly better search engine ranking and "some cover" from "nuisance suits".
12) Meta tags: These are invisible bits of computer code inside your website. Originally they were intended to allow the website designer to talk directly to the spiders and explain what was on the site. Great idea! Unfortunately they were so misused that the spiders now all but ignore them. About 20 years ago it was common practice for web designers to put phrases such as "Pamela Anderson" or "Sexy Girls" into the invisible meta tags a dozen times. That drove lots of "hits" to the website and the designer could say to the client "look how popular the new site I did for you is". Of course, the people going to the site were not interested in "yellow plastic sharks", so all the "hits" were a waste of bandwidth.
Some search engines still use the meta description tag as the "text" shown in a search result. That tag is still important, but only because viewers who find your site in the search engine may see that text. No current spider considers the content of meta tags an important deciding factor in how your site gets ranked.
13) Lack of broken links: Even if you give the spiders a robot file (see #9 above), the spiders will still try and follow every internal and external link on every page they have permission to catalog. If the spider finds nonworking links, you will get a slightly lower ranking. Broken links can be within your site, or to external sites. Our "yellow plastic shark" site might include a link to "Joe's Toys", a store that sells our sharks. If Joe goes out of business, or changes his website address, it is best for our search engine optimization to fix or remove that broken link on our site.
14) W3C compliant code: The W3C is the international standards organization for the entire WWW and is made up of members from most big web companies (Microsoft, Apple, Mozilla/Firefox, AOL, Cisco, Sun and about 430 others). Perhaps the W3C's most important function is setting standards on how the actual computer code that creates a website should be written. The "closer" we stay to using W3C compliant code, the better the spiders will rank your site.
Notice that we used the word "closer" in the previous sentence; it is often a bad idea to insist on being 100% compliant with W3C standards. If you've read the browser discussion on our statistics page, you may remember the mention that Microsoft's popular Internet Explorer browser has "a frightening number of idiosyncrasies". Even though Microsoft is a member of the W3C, they sometimes ignore the rules! That means that we sometimes have to bend or break the rules too. To be fair, all browsers ignore some of the W3C guidelines; Internet Explorer just ignores the most.