Sunday, 15 April 2012

E-Commerce Midterm

Fareza Novandro R.

0910223109

Yahoo's Guidelines for Webmasters

How do I add my site to your search engine database?
Yahoo! Search crawls the web every 2-4 weeks and automatically finds new content for indexing. If pages already in the Yahoo! Search index link to your site, it will be considered for inclusion in the next update of the index. Getting your site listed in major directory services such as the Yahoo! Directory and DMOZ is an excellent way to be sure that there are links to your site.
Pages Yahoo! Wants Included in Its Index
* Original and unique content of genuine value
* Pages designed primarily for humans, with search engine considerations secondary
* Hyperlinks intended to help people find interesting, related content, when applicable
* Metadata (including title and description) that accurately describes the contents of a Web page
* Good Web design in general
Unfortunately, not all Web pages contain information that is valuable to a user. Some pages are created deliberately to trick the search engine into offering inappropriate, redundant or poor-quality search results; this is often called "spam." Yahoo! does not want these pages in the index.
What Yahoo! Considers Unwanted
* Pages that harm accuracy, diversity or relevance of search results
* Pages dedicated to directing the user to another page
* Pages that have substantially the same content as other pages
* Sites with numerous, unnecessary virtual hostnames
* Pages in great quantity, automatically generated or of little value
* Pages using methods to artificially inflate search engine ranking
* The use of text that is hidden from the user
* Pages that give the search engine different content than what the end-user sees
* Excessively cross-linking sites to inflate a site's apparent popularity
* Pages built primarily for the search engines
* Misuse of competitor names
* Multiple sites offering the same content
* Pages that use excessive pop-ups, interfering with user navigation
* Pages that seem deceptive, fraudulent or provide a poor user experience
How do I improve the ranking of my web site in the search results?
Generally, ensure you have unique, targeted content for your audience. Here are a few tips that can help your page be found by a focused search on the Internet:
* Think carefully about the key terms your users will search on to find content like yours. Use those terms to guide the text and construction of your page.
* Users are more likely to click a link if the title matches their search. Choose terms for the title that match the concept of your document.
* Use a "description" meta-tag and write your description accurately and carefully. After the title, the description is the most important draw for users. Make sure the document title and description attract the interest of the user but also fit the content on your site.
* Use a "keywords" meta-tag to list key words for the document. Use a distinct list of keywords that relates to the specific page on your site instead of using one broad set of keywords for every page.
* Keep relevant text and links in HTML. Placing them in graphics or image maps means search engines can't always search for the text, and the crawler can't follow links to your site's other pages. An HTML site map, with a link from your welcome page, can help make sure all your pages are crawled.
* Use ALT text for graphics. It's good page design to accommodate text browsers or visually impaired visitors, and it helps improve the text content of your page for search purposes.
* Correspond with webmasters and other content providers and build rich linkages between related pages. Note: "Link farms" create links between unrelated pages for no reason except to increase page link counts. Using link farms violates Yahoo!'s Site Guidelines and will not improve your page ranking.
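To see what a crawler actually reads from the head of a page, the following sketch uses only Python's standard-library html.parser to extract the title and the description/keywords meta tags discussed above. The sample page is invented for illustration:

```python
from html.parser import HTMLParser

class MetaExtractor(HTMLParser):
    """Collect the <title> text and description/keywords meta tags."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            d = dict(attrs)
            name = d.get("name", "").lower()
            if name in ("description", "keywords"):
                self.meta[name] = d.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

# Hypothetical page: title and description match the page's actual content.
page = """<html><head>
<title>Handmade Leather Wallets - Craft Goods Shop</title>
<meta name="description" content="Handmade leather wallets, belts and bags, crafted to order.">
<meta name="keywords" content="leather wallet, handmade wallet, leather goods">
</head><body>...</body></html>"""

parser = MetaExtractor()
parser.feed(page)
print(parser.title)
print(parser.meta["description"])
```

If the printed title and description do not read like an accurate, appealing summary of the page, neither will the search result built from them.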
References: Google, Nanang Suryadi

Google Webmaster Guidelines

Design and Content Guidelines
    • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.

    • Offer a site map to your users with links that point to the important parts of your site. If the site map has an extremely large number of links, you may want to break the site map into multiple pages.

    • Keep the links on a given page to a reasonable number.

    • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.

    • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.

    • Try to use text instead of images to display important names, content, or links. The Google crawler doesn't recognize text contained in images. If you must use images for textual content, consider using the "ALT" attribute to include a few words of descriptive text.

    • Make sure that your <title> elements and ALT attributes are descriptive and accurate.

    • Check for broken links and correct HTML.

    • If you decide to use dynamic pages (i.e., the URL contains a "?" character), be aware that not every search engine spider crawls dynamic pages as well as static pages. It helps to keep the parameters short and the number of them few.

    • Review our recommended best practices for images and video.
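The advice above about text links and ALT attributes can be checked mechanically. The sketch below (standard library only, with invented sample markup) collects every text link a crawler could follow and flags images that are missing ALT text:

```python
from html.parser import HTMLParser

class CrawlabilityChecker(HTMLParser):
    """Collect text links and flag images without ALT text."""
    def __init__(self):
        super().__init__()
        self.links = []        # hrefs a crawler can follow
        self.missing_alt = []  # image sources lacking descriptive text

    def handle_starttag(self, tag, attrs):
        d = dict(attrs)
        if tag == "a" and "href" in d:
            self.links.append(d["href"])
        elif tag == "img" and not d.get("alt"):
            self.missing_alt.append(d.get("src", ""))

# Hypothetical page with two text links, one image lacking ALT text.
page = """<html><body>
<a href="/products.html">Products</a>
<a href="/sitemap.html">Site map</a>
<img src="/logo.png">
<img src="/photo.jpg" alt="Leather wallet, front view">
</body></html>"""

checker = CrawlabilityChecker()
checker.feed(page)
print(checker.links)        # ['/products.html', '/sitemap.html']
print(checker.missing_alt)  # ['/logo.png']
```

Running a checker like this over every page is a quick way to confirm that each page is reachable from at least one static text link.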
Technical Guidelines
    • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.


    • Allow search bots to crawl your sites without session IDs or arguments that track their path through the site. These techniques are useful for tracking individual user behavior, but the access pattern of bots is entirely different. Using these techniques may result in incomplete indexing of your site, as bots may not be able to eliminate URLs that look different but actually point to the same page.


    • Make sure your web server supports the If-Modified-Since HTTP header. This feature allows your web server to tell Google whether your content has changed since we last crawled your site. Supporting this feature saves you bandwidth and overhead.
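A conditional request of this kind can be sketched with Python's standard library. The URL and timestamp below are placeholders; a server that supports the header replies 304 Not Modified with an empty body when the content has not changed, so no bandwidth is spent re-sending the page:

```python
import urllib.request
from email.utils import formatdate

# Hypothetical Unix timestamp of the previous crawl (placeholder value).
last_crawl = 1334448000

# Format the timestamp as an RFC 1123 date, as the header requires.
header_value = formatdate(last_crawl, usegmt=True)

# Build (but do not send) a conditional GET; example.com is a placeholder.
req = urllib.request.Request(
    "http://example.com/",
    headers={"If-Modified-Since": header_value},
)
print(header_value)  # Sun, 15 Apr 2012 00:00:00 GMT
```

On the server side, supporting the header means comparing this date against the resource's last modification time and answering 304 instead of 200 when nothing is newer.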


    • Make use of the robots.txt file on your web server. This file tells crawlers which directories can or cannot be crawled. Make sure it's current for your site so that you don't accidentally block the Googlebot crawler. Visit http://code.google.com/web/controlcrawlindex/docs/faq.html to learn how to instruct robots when they visit your site. You can test your robots.txt file to make sure you're using it correctly with the robots.txt analysis tool available in Google Webmaster Tools.
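Rules can also be tested programmatically before deployment. The sketch below uses Python's standard urllib.robotparser with a hypothetical robots.txt; the /search rule shows the kind of directive that keeps auto-generated search-result pages out of the crawl:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: block search-result pages and a temp directory.
rules = """\
User-agent: *
Disallow: /search
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Content pages stay crawlable; search-result URLs are blocked.
print(rp.can_fetch("Googlebot", "http://example.com/products/42"))   # True
print(rp.can_fetch("Googlebot", "http://example.com/search?q=shoes"))  # False
```

Checking rules this way before publishing helps avoid the mistake the guideline warns about: accidentally blocking Googlebot from pages you want indexed.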


    • Make reasonable efforts to ensure that advertisements do not affect search engine rankings. For example, Google's AdSense ads and DoubleClick links are blocked from being crawled by a robots.txt file.


    • If your company buys a content management system, make sure that the system creates pages and links that search engines can crawl.


    • Use robots.txt to prevent crawling of search results pages or other auto-generated pages that don't add much value for users coming from search engines.


    • Monitor your site's performance and optimize load times. Google's goal is to provide users with the most relevant results and a great user experience. Fast sites increase user satisfaction and improve the overall quality of the web (especially for those users with slow Internet connections), and we hope that as webmasters improve their sites, the overall speed of the web will improve. Google strongly recommends that all webmasters regularly monitor site performance using Page Speed, YSlow, WebPagetest, or other tools. For more information, tools, and resources, see Let's Make The Web Faster. In addition, the Site Performance tool in Webmaster Tools shows the speed of your website as experienced by users around the world.
References: Google, Nanang Suryadi